Why Technology Alone Can't Solve AI's Bias Problem - HBS Working Knowledge

#artificialintelligence

In a cluttered online world, few can resist the convenience of an automated ranking when deciding what movie to watch on Netflix or which seafood restaurant looks promising in a Google search. But when it comes to finding a job candidate or someone to do a basic household task, there's often a human toll to letting algorithms do the work. Searches on popular recruiting sites might seem like a neutral way to find prospective candidates, but their underlying technology can reinforce biases by excluding underrepresented groups, including women. For instance, research shows that women receive fewer employment reviews on the popular online freelancing site TaskRabbit compared to men with the same experience--and this lack of reviews can lower the rankings of women in talent search algorithms. "Maybe there is a bias from people who have been traditionally hiring men," explains Himabindu Lakkaraju, an assistant professor at Harvard Business School.
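
The mechanism described above can be sketched in a few lines. The scoring function below is purely hypothetical (not TaskRabbit's actual algorithm): whenever review count is used as a ranking feature, a candidate who accumulates fewer reviews ranks lower even with identical experience.

```python
# Hypothetical candidates: identical experience, different review counts
# (the gap in reviews mirrors the disparity described in the research above).
candidates = [
    {"name": "A", "gender": "M", "years_experience": 5, "reviews": 40},
    {"name": "B", "gender": "F", "years_experience": 5, "reviews": 22},
]

def score(c, w_exp=1.0, w_rev=0.1):
    # Assumed linear weighting; the choice to include "reviews" as a
    # feature, not the arithmetic, is what imports the bias.
    return w_exp * c["years_experience"] + w_rev * c["reviews"]

ranked = sorted(candidates, key=score, reverse=True)
# Candidate A outranks B despite identical experience.
```

The point of the sketch is that a ranking can be "neutral" in its formula yet biased in its inputs: if one group systematically receives fewer reviews, any feature built on review volume transmits that disparity into the final ordering.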


How machine learning can course-correct inherent biases in recruiting

#artificialintelligence

Machine learning can help mitigate the biases present within organisations' recruiting practices. Artificial intelligence has often been portrayed as dystopian when it comes to human resources. In one famous example from 2018, Amazon had been using it extensively in its hiring process but ultimately pulled the plug when it emerged that the algorithm was biased against women: the AI had learned to treat candidates who used masculine language as successful, so rather than correcting for historical sexism, it reinforced it. Yet the technology has come a long way in just the last few years. Machine learning is now being used to tackle bias in hiring decisions themselves, not merely to score candidates coldly on performance metrics.
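
The failure mode in the Amazon example can be illustrated with a toy sketch. The data and token weighting below are entirely hypothetical (a crude count-based scorer, not Amazon's system): a screener trained on historically biased labels learns that a gendered token predicts rejection, and dropping such tokens from the feature set is one blunt mitigation.

```python
# Hypothetical training data: (resume text, historical hire label).
# The labels encode past bias, so the word "women's" correlates with rejection.
resumes = [
    ("led fraternity coding club", 1),
    ("captain chess executed projects", 1),
    ("women's coding society lead", 0),
    ("women's chess club captain", 0),
]

def token_weights(data, blocklist=()):
    # Per-token weight = (#hire occurrences - #reject occurrences),
    # skipping any token on the blocklist.
    w = {}
    for text, label in data:
        for tok in text.split():
            if tok in blocklist:
                continue
            w[tok] = w.get(tok, 0) + (1 if label else -1)
    return w

biased = token_weights(resumes)                        # learns "women's" as negative
mitigated = token_weights(resumes, blocklist={"women's"})  # crude feature removal
```

Note that blocklisting gendered tokens is only a surface fix: correlated proxies (club names, schools, gaps in employment) can re-encode the same signal, which is why the article argues bias needs to be tackled in the decisions themselves.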


Understanding Algorithmic Biases & Its Impact On Online Hiring

#artificialintelligence

The last few years have seen a dramatic rise in online job-hiring platforms. Popular platforms such as LinkedIn and TaskRabbit now play a significant role in matching millions of job seekers with work. These platforms are also well known for another reason: their heavy use of automated tools and machine learning techniques. Ranking algorithms, in particular, are widely employed by online job platforms to determine how job seekers are presented to potential employers, scoring candidates against job-specific criteria such as skill requirements and work experience.


The Effect of the Rooney Rule on Implicit Bias in the Long Term

Celis, L. Elisa, Hays, Chris, Mehrotra, Anay, Vishnoi, Nisheeth K.

arXiv.org Artificial Intelligence

A robust body of evidence demonstrates the adverse effects of implicit bias in various contexts--from hiring to health care. The Rooney Rule is an intervention developed to counter implicit bias and has been implemented in the private and public sectors. The Rooney Rule requires that a selection panel include at least one candidate from an underrepresented group in their shortlist of candidates. Recently, Kleinberg and Raghavan proposed a model of implicit bias and studied the effectiveness of the Rooney Rule when applied to a single selection decision. However, selection decisions often occur repeatedly over time. Further, it has been observed that, given consistent counterstereotypical feedback, implicit biases against underrepresented candidates can change. We consider a model of how a selection panel's implicit bias changes over time given their hiring decisions, either with or without the Rooney Rule in place. Our main result is that, when the panel is constrained by the Rooney Rule, their implicit bias decreases at a rate roughly inverse to the size of the shortlist, independent of the number of candidates; without the Rooney Rule, the rate is inversely proportional to the number of candidates. Thus, when the number of candidates is much larger than the size of the shortlist, the Rooney Rule enables a faster reduction in implicit bias, providing an additional reason in favor of using it as a strategy to mitigate implicit bias. To empirically evaluate the long-term effect of the Rooney Rule in repeated selection decisions, we conduct an iterative candidate-selection experiment on Amazon MTurk. We observe that decision-makers subject to the Rooney Rule indeed select more minority candidates than the rule itself requires, compared with what they would select if no rule were in effect, and do so without considerably decreasing the utility of the candidates selected.
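
The dynamic the abstract describes can be illustrated with a toy simulation. This is a deliberately simplified sketch of my own, not the exact Kleinberg-Raghavan model or the paper's update rule: perceived utility of underrepresented candidates is divided by a bias factor beta >= 1, and each underrepresented candidate who makes the shortlist nudges beta back toward 1 (counterstereotypical feedback). With the Rooney Rule, at least one such candidate appears every round, so the bias erodes; without it, a large pool lets the bias lock them out of the shortlist entirely.

```python
import random

def simulate(n_candidates=100, shortlist_size=5, rounds=200,
             beta=2.0, rooney_rule=True, learning_rate=0.05, seed=0):
    """Toy model (not the paper's): uniform true utilities, half the pool
    underrepresented, perceived utility of that group divided by beta."""
    rng = random.Random(seed)
    for _ in range(rounds):
        # (true_utility, is_underrepresented) pairs
        pool = [(rng.random(), i < n_candidates // 2) for i in range(n_candidates)]
        perceived = sorted(pool, key=lambda c: c[0] / beta if c[1] else c[0],
                           reverse=True)
        shortlist = perceived[:shortlist_size]
        if rooney_rule and not any(u for _, u in shortlist):
            # Rule binds: put the best underrepresented candidate on the list.
            best_u = max((c for c in pool if c[1]), key=lambda c: c[0])
            shortlist[-1] = best_u
        # Assumed feedback rule: each underrepresented shortlistee reduces
        # bias in proportion to their share of the shortlist.
        hits = sum(1 for _, u in shortlist if u)
        beta = max(1.0, beta - learning_rate * hits / shortlist_size)
    return beta

with_rule = simulate(rooney_rule=True)      # bias driven down to 1.0
without_rule = simulate(rooney_rule=False)  # bias stays near its start
```

Under these assumptions the rule guarantees at least one underrepresented shortlistee per round, so beta falls by at least a fixed amount tied to the shortlist size each round, echoing the paper's qualitative result that the reduction rate depends on the shortlist size rather than the pool size.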